Global optimization using random embeddings
Authors
Abstract
We propose a random-subspace algorithmic framework for global optimization of Lipschitz-continuous objectives, and analyse its convergence using novel tools from conic integral geometry. X-REGO randomly projects, in a sequential or simultaneous manner, the high-dimensional original problem into low-dimensional subproblems that can then be solved with any global, or even local, solver. We estimate the probability that the randomly-embedded subproblem shares (approximately) the same optimum as the original problem. This success probability is then used to show almost sure convergence to an approximate solution of the original problem, under weak assumptions on the problem (having a strictly feasible solution) and on the solver (guaranteed to find an approximate solution of the reduced problem with sufficiently high probability). In the particular case of unconstrained objectives with low effective dimension, we propose a variant that explores random subspaces of increasing dimension until the effective dimension is found, leading to global convergence after a finite number of embeddings, proportional to the effective dimension. We show numerically that this variant efficiently finds both the effective dimension and an approximate global minimizer.
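A minimal sketch of this random-subspace idea, assuming Python with NumPy and SciPy, is shown below. It illustrates the general scheme described in the abstract rather than the authors' X-REGO implementation: the objective f, the reduced dimension d, the box bound on the reduced variables, and the use of scipy.optimize.differential_evolution as the low-dimensional solver are all placeholder choices.

```python
import numpy as np
from scipy.optimize import differential_evolution


def random_embedding_minimize(f, D, d=2, n_embeddings=10, box=5.0, seed=0):
    """Sketch of sequential random-subspace global optimization.

    Repeatedly minimizes the reduced objective y -> f(p + A @ y), where A is a
    D x d Gaussian matrix and p is the best full-space point found so far, and
    hands each low-dimensional subproblem to an off-the-shelf global solver.
    """
    rng = np.random.default_rng(seed)
    p = np.zeros(D)                      # current anchor point in R^D
    best_val = f(p)
    for _ in range(n_embeddings):
        A = rng.standard_normal((D, d))  # random embedding (one possible choice)
        reduced = lambda y: f(p + A @ y) # low-dimensional subproblem
        res = differential_evolution(reduced, bounds=[(-box, box)] * d,
                                     seed=int(rng.integers(1 << 31)), tol=1e-8)
        if res.fun < best_val:           # keep the embedding only if it improves
            best_val, p = res.fun, p + A @ res.x
    return p, best_val


# Toy 100-dimensional objective whose value depends on only two coordinates,
# i.e. it has low effective dimension.
if __name__ == "__main__":
    f = lambda x: (x[0] - 1.0) ** 2 + (x[3] + 2.0) ** 2
    x_best, f_best = random_embedding_minimize(f, D=100)
    print(f_best)  # expected to be near 0
```

Keeping only improving anchor points is one simple way to chain the embeddings sequentially; the framework analysed in the paper is more general and also covers handling the embedded subproblems simultaneously.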
Similar resources
Global optimization: partitioned random
We consider a combination of state space partitioning and random search methods for solving a deterministic global optimization problem. We assume that function computations are costly and finding the global optimum is difficult. Therefore, we may decide to stop searching long before we have found a solution close to the optimum. The final reward of the algorithm is defined as the best found function value minus t...
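For context, the random-search component alone takes only a few lines; the sketch below (Python, with placeholder objective, bounds, and budget) caps the number of costly function evaluations and returns the best value found, but omits the state-space partitioning and the reward accounting discussed in the abstract.

```python
import numpy as np


def random_search(f, bounds, budget=200, seed=0):
    """Pure random search: sample uniformly in a box, keep the best value.

    `budget` caps the number of (costly) function evaluations, reflecting a
    setting where we may stop long before reaching the global optimum.
    """
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    best_x, best_val = None, np.inf
    for _ in range(budget):
        x = rng.uniform(lo, hi)
        val = f(x)
        if val < best_val:
            best_x, best_val = x, val
    return best_x, best_val


# Example: 2-D quadratic with minimum at (1, -2), evaluated at most 200 times.
x_best, f_best = random_search(lambda x: (x[0] - 1) ** 2 + (x[1] + 2) ** 2,
                               bounds=[(-5, 5), (-5, 5)])
```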
Bayesian Optimization in High Dimensions via Random Embeddings
Bayesian optimization techniques have been successfully applied to robotics, planning, sensor placement, recommendation, advertising, intelligent user interfaces and automatic algorithm configuration. Despite these successes, the approach is restricted to problems of moderate dimension, and several workshops on Bayesian optimization have identified its scaling to high dimensions as one of the h...
Bayesian Optimization in a Billion Dimensions via Random Embeddings
Bayesian optimization techniques have been successfully applied to robotics, planning, sensor placement, recommendation, advertising, intelligent user interfaces and automatic algorithm configuration. Despite these successes, the approach is restricted to problems of moderate dimension, and several workshops on Bayesian optimization have identified its scaling to high dimensions as one of the h...
Global graph kernels using geometric embeddings
Applications of machine learning methods increasingly deal with graph structured data through kernels. Most existing graph kernels compare graphs in terms of features defined on small subgraphs such as walks, paths or graphlets, adopting an inherently local perspective. However, several interesting properties such as girth or chromatic number are global properties of the graph, and are not capt...
Random linear embeddings
This lemma states that any finite point set can be linearly embedded into R^k such that all pairwise distances are approximately preserved. What is remarkable is that the required target dimension k does not depend on the source dimension d at all. Therefore, any data analysis that depends only on interpoint Euclidean distances between n data points can be approximately carried out in dimension ...
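As a quick numerical illustration of this statement, the sketch below (Python with NumPy and SciPy) projects a point set with a scaled Gaussian matrix and compares pairwise distances before and after; the dimensions n, d, k and the Gaussian construction are just one common instantiation.

```python
import numpy as np
from scipy.spatial.distance import pdist

# Embed n points from R^d into R^k with a scaled Gaussian matrix and compare
# pairwise Euclidean distances before and after the projection.
rng = np.random.default_rng(0)
n, d, k = 100, 10_000, 500          # note: k is chosen based on n, not on d
X = rng.standard_normal((n, d))     # synthetic data (placeholder)
P = rng.standard_normal((d, k)) / np.sqrt(k)   # random linear embedding
Y = X @ P

ratios = pdist(Y) / pdist(X)
print(ratios.min(), ratios.max())   # typically both close to 1
```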
Journal
Journal title: Mathematical Programming
Year: 2022
ISSN: 0025-5610, 1436-4646
DOI: https://doi.org/10.1007/s10107-022-01871-y